Some Results on Weakly Accurate Base Learners for Boosting Regression and Classification
Author
Abstract
One basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate `weak' (or more appropriately, `weakly accurate') hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introducing a geometric concept called the angular span for the base hypothesis space. The exponential convergence rates of boosting algorithms are shown to be bounded below by essentially the angular spans. Sufficient conditions for nonzero angular span are also given and validated for a wide class of regression and classification systems.
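The weakly-accurate assumption can be made concrete with a minimal AdaBoost sketch (the dataset and all names below are hypothetical illustrations, not from the paper): so long as each round's base hypothesis has weighted error ε_t strictly below 1/2, the reweighting scheme drives the combined classifier's training error down exponentially.

```python
# Minimal AdaBoost sketch on a toy 1-D dataset (hypothetical example).
# Each decision stump only needs weighted error < 1/2 (better than random
# guessing) for the ensemble's training error to shrink.
import math

X = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5]
y = [+1, +1, -1, -1, +1, +1, -1, -1]

def stump(threshold, sign):
    """Decision stump: predicts `sign` left of the threshold, -sign right."""
    return lambda x: sign if x < threshold else -sign

def best_stump(weights):
    """Return (weighted error, stump) minimizing error over all candidates."""
    best = None
    for t in [i + 1.0 for i in range(len(X))]:
        for s in (+1, -1):
            h = stump(t, s)
            err = sum(w for w, xi, yi in zip(weights, X, y) if h(xi) != yi)
            if best is None or err < best[0]:
                best = (err, h)
    return best

def adaboost(rounds):
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha_t, h_t)
    for _ in range(rounds):
        eps, h = best_stump(weights)
        eps = max(eps, 1e-12)            # guard against a perfect stump
        alpha = 0.5 * math.log((1 - eps) / eps)
        ensemble.append((alpha, h))
        # Reweight: misclassified points gain weight, correct ones lose it.
        weights = [w * math.exp(-alpha * yi * h(xi))
                   for w, xi, yi in zip(weights, X, y)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    score = sum(a * h(x) for a, h in ensemble)
    return +1 if score >= 0 else -1

model = adaboost(rounds=3)
train_err = sum(predict(model, xi) != yi
                for xi, yi in zip(X, y)) / len(X)  # → 0.0
```

On this toy pattern, no single stump separates the classes, yet every stump beats random guessing under any reweighting, so three boosting rounds already suffice to reach zero training error.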
Similar papers
VipBoost: A More Accurate Boosting Algorithm
Boosting is a well-known method for improving the accuracy of many learning algorithms. In this paper, we propose a novel boosting algorithm, VipBoost (voting on boosting classifications from imputed learning sets), which first generates multiple incomplete datasets from the original dataset by randomly removing a small percentage of observed attribute values, then uses an imputer to fill in th...
On Weak Base Learners for Boosting Regression and Classification
The most basic property of the boosting algorithm is its ability to reduce the training error, subject to the critical assumption that the base learners generate weak hypotheses that are better than random guessing. We exploit analogies between regression and classification to give a characterization of which base learners generate weak hypotheses, by introducing a geometric concept called the an...
Ensemble Learning
An ensemble contains a number of learners, which are usually called base learners. The generalization ability of an ensemble is usually much stronger than that of its base learners. Ensemble learning is appealing because it is able to boost weak learners, which are only slightly better than random guessing, into strong learners that can make very accurate predictions. So, "base learners" are als...
On Weak Base Hypotheses and Their Implications for Boosting Regression and Classification
When studying the training error and the prediction error for boosting, it is often assumed that the hypotheses returned by the base learner are weakly accurate, or are able to beat a random guesser by a certain amount of difference. It has been an open question how much this difference can be, whether it will eventually disappear in the boosting process or be bounded by a positive amount. This...
On Weak Base Hypotheses and Their Implications
When studying the training error and the prediction error for boosting, it is often assumed that the hypotheses returned by the base learner are weakly accurate, or are able to beat a random guesser by a certain amount of difference. It has been an open question how much this difference can be, whether it will eventually disappear in the boosting process or be bounded by a finite amount; see,...